Information-theoretic properties of languages and their grammars

Published August 1984

Title from cover

"Prepared for: Chief of Naval Research"--Cover

"August 1984"--Cover


DTIC Identifiers: Strings

Author(s) key words: Entropy, negentropy, information theory, language metrics, quantitative linguistics, language complexity, information content of languages, average derivation length, average information used by grammars, average string length, information density, frequency of occurrence, annotated grammar

Includes bibliographical references (p. 47)

Technical report; 1984

This document describes means for computing a number of information-theoretic properties of languages and their grammars. For example, the entropy of a system of symbols is widely recognized as a measure of that system's complexity and organization. It is shown how the entropy of a language can be computed in a simple way from a grammar annotated with production probabilities. The author then develops means for statistically estimating these production probabilities from measurable properties of strings in the language. He also considers the computation of other information-theoretic properties of languages and grammars, such as the average information borne by a symbol in a language and the average information used by the productions of a grammar. (Author)
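The core idea summarized above — computing entropy from a grammar annotated with production probabilities — can be illustrated with a minimal sketch. This is not the report's own method, only a hedged example: it assumes a grammar represented as a hypothetical dictionary mapping each nonterminal to the probabilities of its productions, and computes the Shannon entropy (in bits) of the choice among productions at each nonterminal.

```python
import math

def production_entropy(probs):
    """Shannon entropy, in bits, of choosing among one nonterminal's
    productions, given their probabilities (which should sum to 1)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical annotated grammar: each nonterminal maps to the
# probabilities attached to its productions. The nonterminal names
# and probabilities here are illustrative, not from the report.
grammar = {
    "S": [0.5, 0.5],     # e.g. S -> a S | b, each with probability 1/2
    "A": [0.25, 0.75],
}

for nt, probs in grammar.items():
    print(nt, production_entropy(probs))
```

A uniform two-way choice yields exactly 1 bit of entropy, while the skewed 0.25/0.75 choice yields less, reflecting its lower uncertainty.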

Publisher Monterey, California : Naval Postgraduate School
Pages 58
Language en_US
Call number ocn460637571
Digitizing sponsor Naval Postgraduate School, Dudley Knox Library
Book contributor Naval Postgraduate School, Dudley Knox Library
Collection navalpostgraduateschoollibrary; fedlink; americana

Full catalog record MARCXML


by Hixenbaugh, Milady Blaha; Hixenbaugh, Paul Noel
Source: half